Information-theoretic lower bounds for convex optimization with erroneous oracles

Authors

  • Yaron Singer
  • Jan Vondrák
Abstract

We consider the problem of optimizing convex and concave functions with access to an erroneous zeroth-order oracle. In particular, for a given function x → f(x) we consider optimization when one is given access to absolute error oracles that return values in [f(x) − ε, f(x) + ε], or relative error oracles that return values in [(1 − ε)f(x), (1 + ε)f(x)], for some ε > 0. We show stark information-theoretic impossibility results for minimizing convex functions and maximizing concave functions over polytopes in this model.
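The two oracle models can be illustrated with a short simulation. The sketch below is not taken from the paper: the function names, the use of NumPy, and the uniformly random perturbations are all illustrative assumptions, and the paper's impossibility results hold against adversarially chosen errors within the stated intervals, not only random ones.

```python
import numpy as np

def absolute_error_oracle(f, eps, rng=np.random.default_rng()):
    """Zeroth-order oracle returning some value in [f(x) - eps, f(x) + eps]."""
    def oracle(x):
        return f(x) + rng.uniform(-eps, eps)
    return oracle

def relative_error_oracle(f, eps, rng=np.random.default_rng()):
    """Zeroth-order oracle returning some value in [(1 - eps) f(x), (1 + eps) f(x)]."""
    def oracle(x):
        return f(x) * (1.0 + rng.uniform(-eps, eps))
    return oracle

# Example: querying an erroneous oracle for the convex function f(x) = ||x||^2.
f = lambda x: float(np.dot(x, x))
noisy_f = absolute_error_oracle(f, eps=0.1)
print(noisy_f(np.array([1.0, 2.0])))  # some value in [4.9, 5.1]
```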

Similar articles

Information-theoretic lower bounds on the oracle complexity of convex optimization

Relative to the large literature on upper bounds on complexity of convex optimization, lesser attention has been paid to the fundamental hardness of these problems. Given the extensive use of convex optimization in machine learning and statistics, gaining an understanding of these complexity-theoretic issues is important. In this paper, we study the complexity of stochastic convex optimization ...

Tight Complexity Bounds for Optimizing Composite Objectives

We provide tight upper and lower bounds on the complexity of minimizing the average of m convex functions using gradient and prox oracles of the component functions. We show a significant gap between the complexity of deterministic vs randomized optimization. For smooth functions, we show that accelerated gradient descent (AGD) and an accelerated variant of SVRG are optimal in the deterministic...

Lower Bounds for Convex Optimization with Stochastic Oracles

We first formalize stochastic optimization in the oracle-versus-optimizer paradigm (Nemirovski and Yudin, 1983) in Section 1, and then sketch the state-of-the-art upper and lower bounds for the rate of convergence (Agarwal et al., 2009) in Section 2. Intuitively, they show that there exists a first-order stochastic oracle (which returns a noisy version of the gradient with zero mean and bounded ...
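As a rough illustration of such a first-order stochastic oracle, the hedged sketch below returns the true gradient plus zero-mean noise; the Gaussian noise model, step size, and objective are assumptions made here purely for illustration, not details taken from the cited work.

```python
import numpy as np

def stochastic_gradient_oracle(grad_f, sigma, rng=np.random.default_rng()):
    """First-order stochastic oracle: returns grad f(x) plus zero-mean noise
    with bounded variance (Gaussian noise is an illustrative choice)."""
    def oracle(x):
        return grad_f(x) + rng.normal(0.0, sigma, size=np.shape(x))
    return oracle

# Example: one stochastic gradient step on f(x) = ||x||^2.
grad_f = lambda x: 2.0 * x
g = stochastic_gradient_oracle(grad_f, sigma=0.1)
x = np.array([1.0, -1.0])
x = x - 0.1 * g(x)  # a single SGD step using the noisy gradient
```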

Information-Theoretic Lower Bounds on the Oracle Complexity of Sparse Convex Optimization

Relative to the large literature on upper bounds on complexity of convex optimization, lesser attention has been paid to the fundamental hardness of these problems. Recent years have seen a surge in optimization methods tailored to sparse optimization problems. In this paper, we study the complexity of stochastic convex optimization in an oracle model of computation, when the objective is optim...

Level bundle methods for constrained convex optimization with various oracles

We propose restricted memory level bundle methods for minimizing constrained convex nonsmooth optimization problems whose objective and constraint functions are known through oracles (black-boxes) that might provide inexact information. Our approach is general and covers many instances of inexact oracles, such as upper, lower and on-demand accuracy oracles. We show that the proposed level bundl...
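One concrete instance of an inexact ("lower") oracle of the kind such bundle methods can accommodate is sketched below: for a piecewise-linear convex function it evaluates only some of the affine pieces, returning a lower bound on the value together with a subgradient of the restricted maximum. The construction and all names are assumptions for illustration, not the oracle definitions of the cited paper.

```python
import numpy as np

def lower_oracle(A, b, n_pieces):
    """Inexact lower oracle for f(x) = max_i (A[i] @ x + b[i]): only the first
    n_pieces affine pieces are evaluated, so the returned value under-estimates
    f(x); the returned vector is a subgradient of the evaluated (restricted) max."""
    def oracle(x):
        vals = A[:n_pieces] @ x + b[:n_pieces]
        i = int(np.argmax(vals))
        return vals[i], A[i]  # lower bound on f(x), subgradient of the restricted max
    return oracle

# Example: 5 random affine pieces, only 3 evaluated per call.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((5, 2)), rng.standard_normal(5)
value_lb, subgrad = lower_oracle(A, b, n_pieces=3)(np.zeros(2))
```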

Journal title:

Volume   Issue

Pages  -

Publication date: 2015